On Polynomial Time Methods for Exact Low Rank Tensor Completion
Authors
Abstract
In this paper, we investigate the sample size requirement for exact recovery of a high order tensor of low rank from a subset of its entries. We show that a gradient descent algorithm with initial value obtained from a spectral method can, in particular, reconstruct a d × d × d tensor of multilinear ranks (r, r, r) with high probability from as few as O(r^{7/2} d^{3/2} log^{7/2} d + r^7 d log^6 d) entries. In the case when the ranks r = O(1), our sample size requirement matches those for nuclear norm minimization (Yuan and Zhang, 2016a), or alternating least squares assuming orthogonal decomposability (Jain and Oh, 2014). Unlike these earlier approaches, however, our method is efficient to compute, easy to implement, and does not impose extra structures on the tensor. Numerical results are presented to further demonstrate the merits of the proposed approach.
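The abstract describes the method only at a high level, so the following is a rough, self-contained NumPy sketch of that kind of procedure rather than the authors' algorithm: a spectral initialization taken from the rescaled unfoldings of the observed tensor, followed by plain gradient descent on the squared error over the observed entries. All function names, the step size, the iteration count, and the ad hoc step scaling for the factor matrices are illustrative assumptions.

```python
import numpy as np

def unfold(T, m):
    """Mode-m matricization of a 3-way tensor."""
    return np.moveaxis(T, m, 0).reshape(T.shape[m], -1)

def spectral_init(T_obs, r, p):
    """Top-r left singular vectors of each rescaled unfolding, plus the projected core."""
    T_hat = T_obs / p  # rescaled observations (zeros outside the sample)
    U = [np.linalg.svd(unfold(T_hat, m), full_matrices=False)[0][:, :r] for m in range(3)]
    G = np.einsum('ijk,ia,jb,kc->abc', T_hat, U[0], U[1], U[2])
    return U, G

def reconstruct(U, G):
    """Tucker-style reconstruction G x_1 U1 x_2 U2 x_3 U3."""
    return np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])

def complete(T_obs, mask, r, p, steps=300, lr=0.5):
    """Gradient descent on the (rescaled) squared error over the observed entries."""
    U, G = spectral_init(T_obs, r, p)
    for _ in range(steps):
        R = mask * (reconstruct(U, G) - T_obs) / p          # rescaled residual on the sample
        # gradients of the loss with respect to the core and the three factor matrices
        gG = np.einsum('ijk,ia,jb,kc->abc', R, U[0], U[1], U[2])
        gU = [np.einsum('ijk,abc,jb,kc->ia', R, G, U[1], U[2]),
              np.einsum('ijk,abc,ia,kc->jb', R, G, U[0], U[2]),
              np.einsum('ijk,abc,ia,jb->kc', R, G, U[0], U[1])]
        G -= lr * gG
        scale = np.linalg.norm(G) ** 2 + 1e-12              # ad hoc scaling to keep factor steps small
        for m in range(3):
            U[m] -= lr * gU[m] / scale
    return reconstruct(U, G)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, r, p = 30, 2, 0.3
    core = rng.standard_normal((r, r, r))
    factors = [np.linalg.qr(rng.standard_normal((d, r)))[0] for _ in range(3)]
    T = np.einsum('abc,ia,jb,kc->ijk', core, *factors)
    mask = rng.random((d, d, d)) < p                        # entries observed with probability p
    T_hat = complete(mask * T, mask, r, p)
    print("relative error:", np.linalg.norm(T_hat - T) / np.linalg.norm(T))
```

The ‖G‖² scaling of the factor steps is only there to keep this unconstrained parameterization stable; the paper's actual updates, step sizes, and guarantees rest on choices not reproduced here.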
Similar resources
Low-Rank Approximation and Completion of Positive Tensors
Unlike the matrix case, computing low-rank approximations of tensors is NP-hard and numerically ill-posed in general. Even computing the best rank-1 approximation of a tensor is NP-hard. In this paper, we use convex optimization to develop polynomial-time algorithms for low-rank approximation and completion of positive tensors. Our approach is to use algebraic topology to define a new (numerically well-p...
Parallel matrix factorization for low-rank tensor completion
Higher-order low-rank tensors naturally arise in many applications including hyperspectral data recovery, video inpainting, seismic data reconstruction, and so on. We propose a new model to recover a low-rank tensor by simultaneously performing low-rank matrix factorizations to the all-mode matricizations of the underlying tensor. An alternating minimization algorithm is applied to solve the mo...
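As a companion to the model just described, here is a minimal NumPy sketch (not the authors' implementation) of the general idea under simplifying assumptions: factor each mode-m matricization of the current estimate as a low-rank product A_m @ B_m, refresh the factors by least squares, average the refolded products with equal weights, and keep the observed entries fixed. The function name, the equal mode weights, and the iteration count are assumptions made for illustration.

```python
import numpy as np

def unfold(X, m):
    """Mode-m matricization."""
    return np.moveaxis(X, m, 0).reshape(X.shape[m], -1)

def fold(M, m, shape):
    """Inverse of unfold: refold a mode-m matricization into a tensor of the given shape."""
    lead = [shape[m]] + [s for i, s in enumerate(shape) if i != m]
    return np.moveaxis(M.reshape(lead), 0, m)

def parallel_mf_completion(T_obs, mask, ranks, iters=200, seed=0):
    """Alternating least-squares completion via per-mode low-rank matrix factorizations."""
    rng = np.random.default_rng(seed)
    X = T_obs.copy()
    N = X.ndim
    A = [rng.standard_normal((X.shape[m], ranks[m])) for m in range(N)]
    B = [rng.standard_normal((ranks[m], unfold(X, m).shape[1])) for m in range(N)]
    for _ in range(iters):
        Y = np.zeros_like(X)
        for m in range(N):
            Xm = unfold(X, m)
            A[m] = Xm @ np.linalg.pinv(B[m])        # least-squares update of the left factor
            B[m] = np.linalg.pinv(A[m]) @ Xm        # least-squares update of the right factor
            Y += fold(A[m] @ B[m], m, X.shape) / N  # average the refolded low-rank products
        X = np.where(mask, T_obs, Y)                # keep observed entries fixed
    return X
```

A call such as parallel_mf_completion(mask * T, mask, ranks=(r, r, r)) returns the completed tensor; in practice the per-mode weights and ranks would be tuned rather than fixed as here.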
Low-Rank Matrix Completion with Adversarial Missing Entries
We give an algorithm for completing an order-m symmetric low-rank tensor from its multilinear entries in time roughly proportional to the number of tensor entries. We apply our tensor completion algorithm to the problem of learning mixtures of product distributions over the hypercube, obtaining new algorithmic results. If the centers of the product distribution are linearly independent, then we...
Statistically Optimal and Computationally Efficient Low Rank Tensor Completion from Noisy Entries
In this article, we develop methods for estimating a low rank tensor from noisy observations on a subset of its entries to achieve both statistical and computational efficiencies. There has been a lot of recent interest in this problem of noisy tensor completion. Much of the attention has been focused on the fundamental computational challenges often associated with problems involving higher ...
Beyond Low Rank: A Data-Adaptive Tensor Completion Method
Low rank tensor representation underpins much of recent progress in tensor completion. In real applications, however, this approach is confronted with two challenging problems, namely (1) tensor rank determination; (2) handling real tensor data which only approximately fulfils the low-rank requirement. To address these two issues, we develop a data-adaptive tensor completion model which explici...
Journal: CoRR
Volume: abs/1702.06980
Pages: -
Publication year: 2017